# Sinhala pretraining

## SinhalaBERTo (keshan)
A relatively small model trained on the deduplicated OSCAR Sinhala dataset, providing a foundation for the low-resource Sinhala language.
Tags: Large Language Model · Other
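
As a sketch of how a checkpoint like this might be used, the snippet below loads it through the Hugging Face transformers fill-mask pipeline. The Hub ID `keshan/SinhalaBERTo` is an assumption inferred from the listing above; RoBERTa-style tokenizers use `<mask>` as the mask token.

```python
from transformers import pipeline

# Hub ID assumed from the listing above; verify the exact name on the Hub.
fill_mask = pipeline("fill-mask", model="keshan/SinhalaBERTo")

# RoBERTa-style checkpoints use "<mask>" as the mask token.
# Example sentence: "We are going <mask> tomorrow."
for pred in fill_mask("අපි හෙට <mask> යනවා."):
    print(pred["token_str"], pred["score"])
```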
## SinBERT-small (NLPC-UOM)
License: MIT
SinBERT is pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using the RoBERTa architecture and is suited to Sinhala text-processing tasks.
Tags: Large Language Model · Transformers · Other
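
For downstream Sinhala text-processing work, a model like this is typically loaded with a tokenizer for fine-tuning or feature extraction. A minimal sketch follows, assuming the Hub ID `NLPC-UOM/SinBERT-small` based on the maintainer name in this listing.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Hub ID assumed from the listing (maintainer NLPC-UOM); adjust if it differs.
model_id = "NLPC-UOM/SinBERT-small"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Tokenize a short Sinhala sentence and run a forward pass.
inputs = tokenizer("සිංහල භාෂාව ලස්සනයි.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Logits over the vocabulary for every input position.
print(outputs.logits.shape)  # e.g. torch.Size([1, seq_len, vocab_size])
```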